1. eigenvector decomposition — decomposition (of a matrix) in terms of its eigenvectors: the representation of a matrix as the product of two mutually conjugate matrices, where the columns of the first are the corresponding eigenvectors, each multiplied by the square root of its eigenvalue. (Англо-русский словарь промышленной и научной лексики)
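The factorization described in the entry above can be checked numerically. A minimal NumPy sketch, assuming A is a real symmetric positive semidefinite matrix (the case in which such a product of conjugate factors exists; the matrix itself is an arbitrary illustrative example):

```python
import numpy as np

# Sketch only: for a symmetric positive semidefinite A with
# eigendecomposition A = V @ diag(w) @ V.T, the matrix B whose
# columns are eigenvectors scaled by sqrt(eigenvalue) satisfies
# A = B @ B.T, matching the dictionary definition.
rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
A = M @ M.T                                # symmetric positive semidefinite

w, V = np.linalg.eigh(A)                   # eigh: symmetric eigensolver
B = V * np.sqrt(np.clip(w, 0.0, None))     # column j = V[:, j] * sqrt(w[j])

assert np.allclose(B @ B.T, A)             # reconstructs A
```

The clip guards against tiny negative eigenvalues produced by floating-point round-off.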
2. разложение по собственным векторам [eigenvector decomposition] — Makarov: eigen decomposition, eigenvector decomposition. (Универсальный русско-английский словарь)
See also in other dictionaries:
Eigenvalue, eigenvector and eigenspace — In mathematics, given a linear transformation, an eigenvector of that linear transformation is a nonzero vector which, when that transformation is applied to it, changes in length, but not direction. For each eigenvector… … Wikipedia
Schur decomposition — In the mathematical discipline of linear algebra, the Schur decomposition or Schur triangulation (named after Issai Schur) is an important matrix decomposition. Statement: the Schur decomposition reads as follows: if A is an n × n square… … Wikipedia
Dynamic mode decomposition — Physical systems, such as fluid flow or mechanical vibrations, behave in characteristic patterns, known as modes. In a recirculating flow, for example, one may think of a hierarchy of vortices, a big main vortex driving smaller secondary ones and … Wikipedia
Pisarenko harmonic decomposition — Pisarenko harmonic decomposition, also referred to as Pisarenko's method, is a method of frequency estimation [Hayes, Monson H., "Statistical Digital Signal Processing and Modeling", John Wiley & Sons, Inc., 1996. ISBN 0-471-59431-8]. This method… … Wikipedia
Choi's theorem on completely positive maps — In mathematics, Choi's theorem on completely positive maps (after Man-Duen Choi) is a result that classifies completely positive maps between finite-dimensional (matrix) C*-algebras. An infinite-dimensional algebraic generalization of Choi's… … Wikipedia
Eigenvalues and eigenvectors — For more specific information regarding the eigenvalues and eigenvectors of matrices, see Eigendecomposition of a matrix. In this shear mapping the red arrow changes direction but the blue arrow does not. Therefore the blue arrow is an… … Wikipedia
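The shear-mapping figure described in the entry above can be reproduced directly. A small sketch (the specific shear matrix is an assumption matching the usual illustration):

```python
import numpy as np

# The figure's shear mapping as a matrix: the blue (horizontal) arrow
# keeps its direction under the shear, so it is an eigenvector; the
# red (vertical) arrow is tilted, so it is not.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])                 # horizontal shear
blue = np.array([1.0, 0.0])
red = np.array([0.0, 1.0])

assert np.allclose(S @ blue, 1.0 * blue)   # eigenvector with eigenvalue 1
sheared = S @ red / np.linalg.norm(S @ red)
assert not np.allclose(sheared, red)       # direction changed: not an eigenvector
```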
Principal component analysis — PCA of a multivariate Gaussian distribution centered at (1,3) with a standard deviation of 3 in roughly the (0.878, 0.478) direction and of 1 in the orthogonal direction. The vectors shown are the eigenvectors of the covariance matrix scaled by… … Wikipedia
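The covariance-eigenvector construction mentioned in the PCA entry can be sketched in a few lines. A hedged illustration (the synthetic data set and its dimensions are assumptions made for the example):

```python
import numpy as np

# PCA via eigendecomposition of the sample covariance matrix.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3)) @ rng.standard_normal((3, 3))  # correlated data

Xc = X - X.mean(axis=0)                    # center the data
C = Xc.T @ Xc / (len(Xc) - 1)              # sample covariance matrix
w, V = np.linalg.eigh(C)                   # eigenvalues in ascending order
order = np.argsort(w)[::-1]                # principal axes: descending variance
components = V[:, order]
scores = Xc @ components                   # coordinates in the principal basis
```

The variance of each score column equals the corresponding (sorted) eigenvalue of the covariance matrix, which is what makes the first component the direction of maximal variance.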
Compact operator on Hilbert space — In functional analysis, compact operators on Hilbert spaces are a direct extension of matrices: on a Hilbert space, they are precisely the closure of finite-rank operators in the uniform operator topology. As such, results from matrix theory… … Wikipedia
Principal components analysis — Principal component analysis (PCA) is a vector space transform often used to reduce multidimensional data sets to lower dimensions for analysis. Depending on the field of application, it is also named the discrete Karhunen–Loève transform (KLT),… … Wikipedia
Spectral theorem — In mathematics, particularly linear algebra and functional analysis, the spectral theorem is any of a number of results about linear operators or about matrices. In broad terms the spectral theorem provides conditions under which an operator or a … Wikipedia
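A finite-dimensional instance of the spectral theorem is easy to verify: a real symmetric matrix is orthogonally diagonalizable. A minimal sketch (the random test matrix is an assumption for illustration):

```python
import numpy as np

# Spectral theorem, finite-dimensional real case:
# a symmetric matrix A factors as A = Q @ diag(lam) @ Q.T
# with real eigenvalues lam and orthogonal Q.
rng = np.random.default_rng(2)
G = rng.standard_normal((4, 4))
A = (G + G.T) / 2                          # symmetric (self-adjoint)

lam, Q = np.linalg.eigh(A)                 # real eigenvalues, orthonormal eigenvectors
assert np.allclose(Q.T @ Q, np.eye(4))     # Q is orthogonal
assert np.allclose(Q @ np.diag(lam) @ Q.T, A)
```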
Perron–Frobenius theorem — In linear algebra, the Perron–Frobenius theorem, proved by Oskar Perron (1907) and Georg Frobenius (1912), asserts that a real square matrix with positive entries has a unique largest real eigenvalue and that the corresponding… … Wikipedia
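The statement in the Perron–Frobenius entry can be observed numerically for a small strictly positive matrix (the specific entries below are an arbitrary example, not taken from the source):

```python
import numpy as np

# Perron-Frobenius illustration: for a matrix with strictly positive
# entries, the eigenvalue of largest modulus is real and simple, and
# its eigenvector can be chosen with all entries positive.
A = np.array([[0.5, 0.4],
              [0.3, 0.8]])                 # all entries strictly positive

w, V = np.linalg.eig(A)
i = int(np.argmax(w.real))                 # index of the Perron root
perron_root = w[i].real
v = V[:, i].real
v = v / v.sum()                            # fix sign/scale: entries become positive

assert perron_root > abs(w[1 - i])         # strictly dominant eigenvalue
assert np.all(v > 0)                       # positive Perron eigenvector
```

Dividing by the entry sum both normalizes the eigenvector and flips its sign if the solver returned the all-negative representative.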